Similar resources
Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information
Information theory is a branch of mathematics that is used in genetic and bioinformatics analyses and can be applied to many analyses of biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...
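The abstract above describes mutual-information-based grouping of genes. The sketch below is not the cited study's pipeline; it only illustrates the general idea under assumed synthetic data: discretized per-gene feature vectors, a distance of 1 − normalized mutual information, and agglomerative clustering. The gene symbols and the 0/1 feature matrix are illustrative assumptions.

```python
# Minimal sketch (not the cited study's method): hierarchical clustering of genes
# using a mutual-information-based distance over discretized feature vectors.
import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
genes = ["DGAT1", "GHR", "ABCG2", "PRLR"]               # example gene symbols (assumed)
features = rng.integers(0, 2, size=(len(genes), 200))   # synthetic discretized features

# Pairwise distance: 1 - normalized mutual information between feature vectors.
n = len(genes)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        nmi = normalized_mutual_info_score(features[i], features[j])
        dist[i, j] = dist[j, i] = 1.0 - nmi

# Agglomerative clustering on the condensed distance matrix, cut into two groups.
labels = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")
print(dict(zip(genes, labels)))
```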
On Classification of Bivariate Distributions Based on Mutual Information
Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
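The claim that mutual information captures non-linear as well as linear dependence can be illustrated with a minimal sketch. The example below is not from the cited paper; the plug-in histogram estimator, sample size, and bin count are assumptions chosen only to show that MI flags a dependence (y = x²) that Pearson correlation misses.

```python
# Minimal sketch: mutual information detects a purely non-linear dependence
# that Pearson correlation does not.
import numpy as np

def mutual_information(x, y, bins=20):
    """Plug-in MI estimate (in nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50_000)
y = x ** 2 + 0.05 * rng.normal(size=x.size)

print("Pearson r:", np.corrcoef(x, y)[0, 1])   # close to 0: no linear dependence
print("MI (nats):", mutual_information(x, y))  # clearly > 0: dependence detected
```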
Long Range Health Planning
In the past, health planning in Iran has been carried out in the context of short-range economic plans. Although this mechanism has helped a great deal in the achievement of certain health plans, the said scheme has fallen short of meeting health objectives on a comprehensive basis. Most often, the health programs have lost their value to the priority and cost-effectiveness of economi...
Quantifying synergistic mutual information
Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures against a suite of binary circuits to demon...
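The notion of synergy referenced above is commonly illustrated with the XOR circuit, one of the standard binary-circuit test cases: each input alone tells you nothing about the output, yet both together determine it completely. The sketch below is only that textbook illustration, not the novel whole-minus-union measure proposed in the cited paper.

```python
# Illustration only (not the paper's proposed measure): in the XOR circuit, each
# input carries zero information about the output, but the pair carries 1 full bit,
# so all of I(X1,X2 ; Y) is synergistic.
import numpy as np
from itertools import product

# Joint distribution over (x1, x2, y) with y = x1 XOR x2 and uniform inputs.
states = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]
p = {s: 0.25 for s in states}

def entropy(var_indices):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for s, ps in p.items():
        key = tuple(s[i] for i in var_indices)
        marg[key] = marg.get(key, 0.0) + ps
    return -sum(q * np.log2(q) for q in marg.values() if q > 0)

def mi(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B), in bits."""
    return entropy(a) + entropy(b) - entropy(a + b)

print("I(X1;Y)    =", mi((0,), (2,)))     # 0.0 bits
print("I(X2;Y)    =", mi((1,), (2,)))     # 0.0 bits
print("I(X1,X2;Y) =", mi((0, 1), (2,)))   # 1.0 bit -> purely synergistic
```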
The kernel mutual information
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information betwee...
Journal
Journal title: ACM SIGMETRICS Performance Evaluation Review
Year: 2008
ISSN: 0163-5999
DOI: 10.1145/1453175.1453181